Single-board computer


An Evaluation of LLMs Inference on Popular Single-board Computers

Tung Nguyen, Tuyen Nguyen

arXiv.org Artificial Intelligence

The growing demand for on-device large language model (LLM) inference is driving interest in deploying lightweight, cost-effective AI solutions on edge hardware. Single-board computers (SBCs) such as the Raspberry Pi and Orange Pi offer a promising platform for localized, privacy-preserving inference, but remain underexplored in the context of LLM workloads. In this work, we benchmark the performance of 25 quantized open-source LLMs across three SBCs (Raspberry Pi 4, Raspberry Pi 5, and Orange Pi 5 Pro) using two inference runtimes: Ollama and Llamafile. We evaluate generation throughput, memory usage, and power consumption under varying CPU configurations, using multiple prompt types to simulate realistic workloads. Our results show that SBCs can reliably support models up to 1.5B parameters, with Llamafile achieving up to 4x higher throughput and 30-40% lower power usage than Ollama. We identify architecture-specific bottlenecks, highlight runtime-level trade-offs, and provide practical deployment recommendations. This study offers the first broad evaluation of LLM inference on SBCs, bridging the gap between high-performance language models and affordable edge computing.
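As context for the generation-throughput metric above: Ollama's /api/generate endpoint reports an `eval_count` (tokens generated) and an `eval_duration` (nanoseconds spent generating) in its final response, from which tokens-per-second falls out directly. The sketch below shows that calculation; the model name and the numeric values are illustrative, not figures from the paper.

```python
import json

# Hypothetical example of the metrics Ollama's /api/generate endpoint
# returns alongside its final response; eval_duration is in nanoseconds.
sample_response = json.dumps({
    "model": "qwen2:1.5b",             # illustrative model name
    "eval_count": 96,                  # tokens generated
    "eval_duration": 12_000_000_000,   # nanoseconds spent generating
})

def tokens_per_second(response_json: str) -> float:
    """Derive generation throughput (tokens/s) from an Ollama response."""
    data = json.loads(response_json)
    return data["eval_count"] / (data["eval_duration"] / 1e9)

print(tokens_per_second(sample_response))  # 8.0 tokens/s
```

On an SBC, repeating this over several prompt types and CPU configurations (as the study does) gives a comparable throughput figure per model and runtime.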


On-Device AI: TensorFlow, PyTorch, or In-House -- Picovoice

#artificialintelligence

There is no shortage of articles discussing which deep learning framework is the best. In this article, we want to focus on a niche. Which framework can make your life easier if your goal is On-Device Deployment? We also explore the controversial topic of building your in-house on-device Inference Engine. TensorFlow comes with TensorFlow Lite for Android, iOS, and single-board computers (e.g.


Level Up Your AI Skillset and Dive Into The Deep End Of TinyML

#artificialintelligence

Machine learning (ML) is a growing field, gaining popularity in academia, industry, and among makers. We will take a look at some of the available tools to help make machine learning easier, but first, let's review some of the terms commonly used in machine learning. John McCarthy provides a definition of artificial intelligence (AI) in his 2007 Stanford paper, "What is Artificial Intelligence?" In it, he says AI "is the science and engineering of making intelligent machines, especially intelligent computer programs." This definition is extremely broad, as McCarthy defines intelligence as "the computational part of the ability to achieve goals in the world." As a result, any program that achieves some goal can easily be classified as artificial intelligence. In her article "Machine Learning on Microcontrollers" (Make: Vol.


TensorFlow 2.0 for Edge TPU Programming

#artificialintelligence

An introduction to TensorFlow 2.0 and to using it with TPUs for edge programming. TensorFlow is an end-to-end open-source platform for machine learning. It has a comprehensive, flexible ecosystem of tools, libraries, and community resources that lets researchers push the state of the art in ML and lets developers easily build and deploy ML-powered applications. Coral helps you bring on-device AI application ideas from prototype to production. It offers a platform of hardware components, software tools, and pre-compiled models for building devices with local AI.


Meet the CuBox-M, a tiny 2-inch PC built for developers and makers

PCWorld

If you're a tinkerer who needs a new machine for machine learning or exotic maker creations, the diminutive CuBox-M Micro Desktop PC from Israel-based SolidRun might be up your alley. This addition to the CuBox line is designed specifically for application development and maker projects such as a smart home hub. The CuBox-M is based on the i.MX 8M Plus system-on-module (SoM) from NXP Semiconductors. It features up to a quad-core Cortex-A53 CPU with a Cortex-M7 core, plus a Cadence Tensilica HiFi 4 digital signal processor for voice and natural language tasks. There's also an integrated neural processing unit for AI and machine learning, and optional Power over Ethernet. The SoM is integrated into a carrier board boasting a number of ports, including one HDMI 2.0, one Ethernet, two USB 3.0, one micro-USB, and a microSD port for storage.


SolidRun takes on Google's Raspberry Pi-like computer

#artificialintelligence

Israeli edge-computing outfit SolidRun has launched a new lineup of Raspberry Pi-like computers based on NXP's new i.MX 8M Plus application processor. SolidRun makes edge-computing kit built around Arm-based and Intel chips. Earlier this year, it teamed up with application-specific integrated circuit (ASIC) chip manufacturer Gyrfalcon Technology to build the Arm-based, Linux-powered Janux GS31 AI inference server. Now the company has launched three new single-board computers powered by NXP's i.MX 8M Plus application processors. They're aimed at the same industrial market Raspberry Pi is targeting beyond its traditional education purposes – and which Google is also targeting with its line of Coral-branded single-board computers.


Google's Raspberry Pi-like Coral board lands: Turbo-charged AI on a tiny computer

ZDNet

Developers can now get their hands on Google's souped-up answer to the Raspberry Pi: the $150 Coral Dev Board, which features Google's Edge TPU machine-learning accelerator for low-powered devices that sit on the edge of a network. Google unveiled the tiny Edge TPU ASIC last July as its low-cost chip for bringing machine learning to sensors that can run models on the TensorFlow Lite framework. The Edge TPU now features in the Coral-branded $75 USB 'thumb drive' accelerator and as part of a removable 'system-on-module' that ships with a developer baseboard. The Edge TPU Module includes an NXP i.MX 8M system-on-chip that consists of a quad-core Cortex-A53 and a Cortex-M4F, a Vivante GC7000 Lite graphics processor, 8GB of eMMC storage, and 1GB of LPDDR4 RAM. The baseboard has an RPi-like 40-pin GPIO expansion header, a microSD slot for flash memory, a Gigabit Ethernet port, USB 2.0 and 3.0 ports for power and peripherals, a 3.5mm audio jack, and a terminal to wire up stereo speakers.


BeagleBone AI supercharges machine learning and computer vision on Raspberry Pi-style board

#artificialintelligence

The newly revealed BeagleBone AI is a board aimed at developers interested in experimenting with machine learning and computer vision. Unveiled this week, the computer houses four dedicated chips originally designed to help self-driving cars "see" the world around them. The board's Texas Instruments (TI) Embedded Vision Engine (EVE) chips offer up to 8x the performance per watt when running calculations for computer-vision models compared to running on an Arm Cortex-A15-based CPU. This optimized hardware is accessible to developers via the TI Deep Learning OpenCL (Open Computing Language) API. The BeagleBoard.org Foundation says the BeagleBone AI board will be able to automate tasks in industrial, commercial, and home settings.